# Chinese NLP Optimization

**Chinese Roberta Wwm Ext Large** (hfl) · Apache-2.0 · 30.27k downloads · 200 likes
A Chinese pre-trained BERT model employing a whole-word masking strategy, designed to accelerate Chinese natural language processing research.
Tags: Large Language Model, Chinese
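The whole-word masking strategy mentioned above masks every character of a segmented Chinese word together, instead of masking individual characters independently, so the model must predict the full word. A minimal sketch of the idea (the function name, toy segmentation, and masking ratio are illustrative, not the model's actual pre-training pipeline):

```python
import random

MASK = "[MASK]"

def whole_word_mask(words, mask_ratio=0.15, rng=None):
    """Whole-word masking: when a segmented word is selected,
    every character in it is replaced with [MASK] at once."""
    rng = rng or random.Random(0)
    out = []
    for word in words:
        if rng.random() < mask_ratio:
            out.extend([MASK] * len(word))  # mask all characters of the word
        else:
            out.extend(list(word))          # keep the word's characters
    return out

# A pre-segmented Chinese sentence: 使用 / 语言 / 模型
sentence = ["使用", "语言", "模型"]
masked = whole_word_mask(sentence, mask_ratio=0.5, rng=random.Random(1))
```

With character-level masking, a word like 语言 could end up half-masked (语[MASK]), letting the model guess from the visible half; whole-word masking removes that shortcut.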
**Chinese Legal Electra Small Generator** (hfl) · Apache-2.0 · 14 downloads · 4 likes
Chinese ELECTRA is a Chinese pre-trained model released by the HIT-iFLYTEK Joint Lab, based on Google's ELECTRA model, combining a compact size with strong performance.
Tags: Large Language Model, Transformers, Chinese
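ELECTRA differs from BERT-style masked language modeling: a small generator proposes replacement tokens, and a discriminator is trained to label which tokens were replaced. A toy sketch of how the discriminator's targets are derived (names and example tokens are hypothetical):

```python
def replaced_token_labels(original, corrupted):
    """ELECTRA-style replaced-token-detection targets: 1 where the
    generator substituted a token, 0 where the original survived."""
    return [int(o != c) for o, c in zip(original, corrupted)]

# toy example: the generator swapped two characters in a 4-token input
original  = ["中", "文", "模", "型"]
corrupted = ["中", "语", "模", "形"]
labels = replaced_token_labels(original, corrupted)  # [0, 1, 0, 1]
```

Because every position carries a training signal (not just the ~15% masked ones), ELECTRA models can stay compact while training efficiently, which fits the "small" variant listed here.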
**Chinese Macbert Base** (hfl) · Apache-2.0 · 22.48k downloads · 132 likes
MacBERT is an improved BERT model that uses a novel "MLM as correction" pre-training task, alleviating the discrepancy between the pre-training and fine-tuning stages.
Tags: Large Language Model, Chinese
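"MLM as correction" corrupts selected words with *similar words* rather than an artificial `[MASK]` token, so pre-training inputs look like natural text in need of correction and fine-tuning never encounters a token it has not seen, which is the discrepancy the description refers to. A hedged sketch of the substitution step (the hand-written synonym table and helper name are illustrative; the actual model selects similar words via word embeddings):

```python
import random

def mac_style_corruption(words, similar, ratio=0.15, rng=None):
    """MacBERT-style 'MLM as correction': corrupt chosen words by
    substituting a similar word (never a [MASK] token); the label is
    the original word the model must restore."""
    rng = rng or random.Random(0)
    corrupted, labels = [], []
    for word in words:
        if rng.random() < ratio and word in similar:
            corrupted.append(similar[word])  # similar-word substitution
            labels.append(word)              # restore target
        else:
            corrupted.append(word)
            labels.append(None)              # not a prediction target
    return corrupted, labels

# hypothetical toy synonym table for illustration only
similar = {"快乐": "高兴", "学习": "研习"}
corrupted, labels = mac_style_corruption(["学习", "模型"], similar, ratio=1.0)
# "学习" becomes "研习" with restore target "学习"; "模型" has no
# similar word in the toy table, so it is kept unchanged
```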
**Roberta Mini Word Chinese Cluecorpussmall** (uer) · 44 downloads · 1 like
A Chinese word-level RoBERTa medium model pre-trained on CLUECorpusSmall, outperforming character-based models on multiple tasks.
Tags: Large Language Model, Chinese
**Chinese Roberta Wwm Ext** (hfl) · Apache-2.0 · 96.54k downloads · 324 likes
A Chinese pre-trained BERT model using whole-word masking, designed to accelerate the development of Chinese natural language processing.
Tags: Large Language Model, Chinese
© 2025 AIbase